The Download: a novel form of censorship in China, and a self-taught robot dog

MIT Technology Review

Imagine you are working on your novel on your home computer. It is nearly finished; you have already written approximately one million words. All of a sudden, the online word-processing software tells you that you can no longer open the draft because it contains illegal information. In an instant, all your words are lost. This is what happened in June to a Chinese novelist writing under the alias Mitu.


Beyond research data infrastructures: exploiting artificial & crowd i…

#artificialintelligence

Web pages indexed by Google (plus countless temporal snapshots of them) carry embedded markup (RDFa, Microdata, Microformats) annotating their content. This markup supports Web search and interpretation and is pushed by Google, Yahoo, Bing et al. via schema.org, but it suffers from factual errors, annotation errors (see also [Meusel et al., ESWC 2015]), and ambiguity and coreferences. KnowMore, a data-fusion approach for markup, addresses this in two steps: 1. Relevance: supervised coreference resolution. 2. Quality & redundancy: data fusion through supervised fact classification (SVM, kNN, RF, LR, NB) over a diverse feature set (authority, relevance, etc.), considering source-level (e.g. PageRank), entity-level, and fact-level features. Separately, the Rich Context & Coleridge Initiative is building (yet another) knowledge graph of scholarly resources and datasets (Stefan Dietze). Context/corpus: publications (currently social sciences, SAGE Publishing). Tasks: I. Extraction/disambiguation of dataset mentions II.
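The fact-classification step described above can be sketched in a few lines. This is a minimal illustration, not the KnowMore implementation: the feature names and toy data are assumptions, and a stdlib-only perceptron stands in for the SVM/kNN/RF/LR/NB classifiers the slides name (in practice one would use a library such as scikit-learn).

```python
# Sketch of the "supervised fact classification" fusion step: each
# candidate fact extracted from Web markup gets a small feature vector
# (here: source authority, relevance, redundancy -- illustrative names,
# not the actual KnowMore feature set), and a binary classifier decides
# whether the fact survives fusion. A toy perceptron stands in for the
# SVM/kNN/RF/LR/NB options mentioned above.

def train(rows, labels, epochs=20, lr=0.1):
    """Perceptron over fact features; returns weights with a trailing bias."""
    w = [0.0] * (len(rows[0]) + 1)
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            err = y - predict(w, x)
            # Update feature weights, then the bias term.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)] + [w[-1] + lr * err]
    return w

def predict(w, x):
    """1 = keep the fact, 0 = discard it."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + w[-1] > 0 else 0

# Toy training set: facts from authoritative, relevant sources are correct.
facts = [[0.9, 0.8, 0.7], [0.8, 0.9, 0.2], [0.1, 0.2, 0.9], [0.2, 0.1, 0.1]]
gold = [1, 1, 0, 0]
weights = train(facts, gold)

# Fusion: keep only the candidate facts the classifier accepts.
candidates = [[0.85, 0.9, 0.5], [0.1, 0.1, 0.8]]
kept = [f for f in candidates if predict(weights, f) == 1]
```

The same pattern extends to the source-, entity-, and fact-level features the slides mention: they simply become additional dimensions of each feature vector.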


A Short-Term Memory Architecture for the Learning of Morphophonemic Rules

Gasser, Michael, Lee, Chan-Do

Neural Information Processing Systems

In the debate over the power of connectionist models to handle linguistic phenomena, considerable attention has been focused on the learning of simple morphological rules. It is a straightforward matter in a symbolic system to specify how the meanings of a stem and a bound morpheme combine to yield the meaning of a whole word and how the form of the bound morpheme depends on the shape of the stem. In a distributed connectionist system, however, where there may be no explicit morphemes, words, or rules, things are not so simple. The most important work in this area has been that of Rumelhart and McClelland (1986), together with later extensions by Marchman and Plunkett (1989). The networks involved were trained to associate English verb stems with the corresponding past-tense forms, successfully generating both regular and irregular forms and generalizing to novel inputs.

